Interior-Point Gradient Methods with Diagonal-Scalings for Simple-Bound Constrained Optimization
Abstract
In this paper, we study diagonally scaled gradient methods for simple-bound constrained optimization in a framework almost identical to that for unconstrained optimization, except that the iterates are kept within the interior of the feasible region. We establish a satisfactory global convergence theory for such interior-point gradient methods applied to Lipschitz continuously differentiable functions without any further assumption. Moreover, a strong convergence result is obtained for a class of so-called L-nonlinear functions, introduced in this paper, which includes virtually all nonlinear functions that do not contain linear pieces.
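To make the algorithmic framework concrete, the Python sketch below runs a diagonally scaled interior-point gradient iteration on the nonnegativity-constrained case x >= 0, with the update x_{k+1} = x_k + alpha_k d_k, d_k = -D_k grad f(x_k). The particular choices made here (the scaling D_k = diag(x_k), the fraction-to-the-boundary factor tau, and the Armijo backtracking rule) are illustrative assumptions and need not coincide with the scalings and step-length rules analyzed in the paper.

```python
import numpy as np


def ipg_step(f, grad, x, tau=0.995, beta=0.5, sigma=1e-4, max_backtracks=50):
    """One diagonally scaled interior-point gradient step for min f(x) s.t. x >= 0.

    Assumes x is strictly positive. D = diag(x) and the damping factor tau
    are illustrative choices, not necessarily those of the paper.
    """
    g = grad(x)
    d = -x * g                                   # direction -D g with D = diag(x)
    neg = d < 0
    # Largest step keeping x + alpha*d strictly positive, damped by tau < 1.
    alpha_max = 1.0 if not np.any(neg) else min(1.0, tau * np.min(-x[neg] / d[neg]))
    # Armijo backtracking underneath the feasibility cap.
    alpha, fx = alpha_max, f(x)
    for _ in range(max_backtracks):
        x_new = x + alpha * d
        if f(x_new) <= fx + sigma * alpha * g.dot(d):
            return x_new
        alpha *= beta
    return x + alpha * d


def ipg_solve(f, grad, x0, iters=200, tol=1e-8):
    """Run the interior-point gradient iteration from a strictly interior x0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x_next = ipg_step(f, grad, x)
        if np.linalg.norm(x_next - x) <= tol * (1.0 + np.linalg.norm(x)):
            return x_next
        x = x_next
    return x


if __name__ == "__main__":
    # Nonnegative least squares: min 0.5*||A x - b||^2 subject to x >= 0.
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
    f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
    grad = lambda x: A.T @ (A @ x - b)
    x_star = ipg_solve(f, grad, np.ones(10))
    print("objective:", f(x_star), "smallest coordinate:", x_star.min())
```

The fraction-to-the-boundary damping is what keeps every iterate strictly inside the feasible region, mirroring the interiority requirement stated in the abstract; the Armijo rule stands in for whichever step-length condition the paper's convergence theory actually imposes.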
Similar Articles
A full Nesterov-Todd step interior-point method for circular cone optimization
In this paper, we present a full Newton step feasible interior-point method for circular cone optimization by using Euclidean Jordan algebra. The search direction is based on the Nesterov-Todd scaling scheme, and only full Newton steps are used at each iteration. Furthermore, we derive the iteration bound that coincides with the currently best known iteration bound for small-update methods.
A Numerical Study of Active-Set and Interior-Point Methods for Bound Constrained Optimization
This paper studies the performance of several interior-point and active-set methods on bound constrained optimization problems. The numerical tests show that the sequential linear-quadratic programming (SLQP) method is robust, but is not as effective as gradient projection at identifying the optimal active set. Interior-point methods are robust and require a small number of iterations and functi...
An interior-point algorithm for $P_*(\kappa)$-linear complementarity problem based on a new trigonometric kernel function
In this paper, an interior-point algorithm for the $P_*(\kappa)$-Linear Complementarity Problem (LCP) based on a new parametric trigonometric kernel function is proposed. By imposing a strictly feasible starting-point condition and using some simple analysis tools, we prove that our algorithm has an $O((1+2\kappa)\sqrt{n}\,\log n\,\log\frac{n}{\epsilon})$ iteration bound for large-update methods, which coinc...
An Interior Point Algorithm for Solving Convex Quadratic Semidefinite Optimization Problems Using a New Kernel Function
In this paper, we consider convex quadratic semidefinite optimization problems and provide a primal-dual Interior Point Method (IPM) based on a new kernel function with a trigonometric barrier term. The iteration complexity of the algorithm is analyzed using some easy-to-check and mild conditions. Although our proposed kernel function is neither a Self-Regular (SR) fun...
A path following interior-point algorithm for semidefinite optimization problem based on new kernel function
In this paper, we obtain some new complexity results for solving the semidefinite optimization (SDO) problem by interior-point methods (IPMs). We define a new proximity function for the SDO problem via a new kernel function. Furthermore, we formulate an algorithm for a primal-dual interior-point method (IPM) for the SDO problem using this proximity function, give its complexity analysis, and then we sho...